203 research outputs found

    Efficient Simulation Approaches for Reliability Analysis of Large Systems


    POWER GRID ROBUSTNESS TO SEVERE FAILURES: TOPOLOGICAL AND FLOW-BASED METRICS COMPARISON

    Power grids are generally regarded as very reliable systems; nevertheless, outages and electricity shortfalls are common events with the potential to produce significant social and economic consequences. It is important to reduce the likelihood of such severe accidents by assuring safe operations and robust topologies. Grid safety relies on accurate vulnerability measures, control schemes and good-quality information. For instance, in power network operations, contingency analysis is used to constrain the network to secure operative states with respect to predefined failures (e.g. a list of single-component failures). An exhaustive failure list is often not tractable, therefore a selection or ranking is performed to guide the choice. To better understand power network weaknesses and strengths, a variety of robustness metrics have been introduced in the literature, although many do not account, or only partially account, for uncertainties which might affect the analysis. In this work, power network vulnerability to failure events is analysed and single line outages (N-1 contingencies) are ranked using different metrics (i.e. topology-based, flow-based and hybrid metrics). Sources of uncertainty, such as power demand variability and lack of precise knowledge of the network parameters, are accounted for and their effect on the component ranking is quantified. A modified version of the IEEE 118-bus power network has been selected as a representative case study. The assumptions underpinning the methodologies and the vulnerability results, also accounting for uncertainty, are discussed.
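
    As a hedged illustration of the topology-based side of such a ranking (not the paper's IEEE 118-bus study or its flow-based and hybrid metrics), the sketch below scores each single line outage of a small hypothetical grid by the drop in global network efficiency it causes, using networkx:

    # Minimal topology-based N-1 ranking sketch; the grid is hypothetical and the
    # metric (global efficiency drop) is only one of the families of metrics the
    # paper compares. Uncertainty propagation is omitted here.
    import networkx as nx

    # Hypothetical small grid: nodes are buses, edges are transmission lines.
    G = nx.Graph()
    G.add_edges_from([(1, 2), (2, 3), (3, 4), (4, 1), (2, 4), (4, 5)])

    base_eff = nx.global_efficiency(G)

    ranking = []
    for line in list(G.edges()):
        H = G.copy()
        H.remove_edge(*line)          # simulate the N-1 contingency
        drop = base_eff - nx.global_efficiency(H)
        ranking.append((line, drop))

    # A larger efficiency drop marks a more critical line under this metric.
    for line, drop in sorted(ranking, key=lambda x: x[1], reverse=True):
        print(f"line {line}: efficiency drop {drop:.3f}")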

    Multi-objective reliability based design of complex engineering structures using response surface methods

    Extensive research contributions have been carried out in the field of Reliability-Based Design Optimisation (RBDO). Traditional RBDO methods deal with a single-objective optimisation problem subject to probabilistic constraints. However, realistic problems in engineering practice require a multi-criteria perspective where two or more conflicting objectives need to be optimised. These types of problems are solved with multi-objective optimisation methods, known as Multi-Objective Reliability-Based Design Optimisation (MORBDO) methods. Usually, significant computational effort is required to solve such problems due to the huge number of complex finite element model evaluations. This paper proposes a practical and efficient approach for tackling this challenge. A multi-objective evolutionary algorithm (MOEA) is combined with a response surface method to efficiently obtain an accurate and uniformly distributed Pareto front. The proposed approach has been implemented in the OpenCossan software. Two examples are presented to show the applicability of the approach: an analytical problem where one of the objectives is the system reliability, and the classic 25-bar transmission tower.
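
    A minimal sketch of the surrogate-assisted idea is given below for an assumed one-dimensional design problem: a quadratic response surface is fitted to a few runs of a placeholder "expensive" model, the two objectives (a cost proxy and a surrogate-based failure probability) are then evaluated cheaply, and a brute-force non-dominated sort stands in for the MOEA. It is not the OpenCossan implementation or either of the paper's examples.

    # Response-surface-assisted multi-objective sketch; the model, failure limit
    # and cost proxy are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(0)

    def expensive_model(d):
        """Placeholder for a costly finite element response (e.g. a stress) at design d."""
        return 10.0 / (1.0 + d)

    # 1) Fit a quadratic response surface from a handful of model evaluations.
    d_train = np.linspace(0.0, 4.0, 7)
    y_train = np.array([expensive_model(d) for d in d_train])
    surrogate = np.poly1d(np.polyfit(d_train, y_train, deg=2))

    def objectives(d):
        """Objective 1: cost proxy (grows with the design variable).
        Objective 2: failure probability from Monte Carlo on the cheap surrogate,
        assuming failure when the noisy response exceeds a limit of 4.0."""
        noise = rng.normal(0.0, 1.0, size=2000)
        return d, np.mean(surrogate(d) + noise > 4.0)

    points = np.array([objectives(d) for d in np.linspace(0.0, 4.0, 81)])

    # 2) Extract the non-dominated (Pareto) designs by pairwise comparison.
    pareto = [p for p in points
              if not any(q[0] <= p[0] and q[1] <= p[1] and (q != p).any()
                         for q in points)]
    print(f"{len(pareto)} non-dominated designs found")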

    Stochastic analysis and reliability-cost optimization of distributed generators and air source heat pumps

    This paper presents a framework for stochastic analysis, simulation and optimisation of electric power grids combined with district heat networks. In this framework, distributed energy sources can be integrated within the grids and their performance is modelled. The effect of uncertain weather and operational conditions on system cost and reliability is considered. A Monte Carlo Optimal Power Flow simulator is employed and statistical indicators of system cost and reliability are obtained. Reliability and cost expectations are used to compare four different investments in heat pumps and electric power generators to be installed on a real-world grid. Generator sizes and positions are analysed to reveal the sensitivity of the cost and reliability of the grid, and an optimal investment problem is tackled using a multi-objective genetic algorithm.

    A hybrid load flow and event driven simulation approach to multi-state system reliability evaluation

    Structural complexity of systems, coupled with their multi-state characteristics, renders their reliability and availability evaluation difficult. Notwithstanding the emergence of various techniques dedicated to complex multi-state system analysis, simulation remains the only approach applicable to realistic systems. However, most simulation algorithms are either system specific or limited to simple systems, since they require enumerating all possible system states, defining the cut-sets associated with each state and monitoring their occurrence. In addition to being extremely tedious for large complex systems, state enumeration and cut-set definition require a detailed understanding of the system's failure mechanism. In this paper, a simple and generally applicable simulation approach, enhanced for multi-state systems of any topology, is presented. Here, each component is defined as a semi-Markov stochastic process and, via discrete-event simulation, the operation of the system is mimicked. The principles of flow conservation are invoked to determine flow across the system for every performance level change of its components using the interior-point algorithm. This eliminates the need for cut-set definition and overcomes the limitations of existing techniques. The methodology can also be exploited to account for the effects of transmission efficiency and loading restrictions of components on system reliability and performance. The principles and algorithms developed are applied to two numerical examples to demonstrate their applicability.
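
    The flow-conservation step can be pictured with the minimal sketch below: for one snapshot of component capacities in a small hypothetical network, the deliverable flow is obtained from a linear programme (scipy's "highs" solver here, whereas the paper refers to an interior-point algorithm); the semi-Markov components and the discrete-event engine around this step are omitted.

    # Flow determined from conservation constraints for one system state;
    # the network and its capacities are illustrative assumptions.
    from scipy.optimize import linprog

    # Edges of a toy source->sink network with current component capacities,
    # which would degrade as components change performance level.
    edges = ["s-a", "s-b", "a-b", "a-t", "b-t"]
    capacity = [5.0, 4.0, 2.0, 3.0, 6.0]

    # Maximise flow leaving the source (linprog minimises, hence the sign flip).
    c = [-1.0, -1.0, 0.0, 0.0, 0.0]

    # Flow conservation at the internal nodes a and b: inflow equals outflow.
    A_eq = [[1.0, 0.0, -1.0, -1.0, 0.0],   # node a
            [0.0, 1.0,  1.0,  0.0, -1.0]]  # node b
    b_eq = [0.0, 0.0]

    res = linprog(c, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0.0, cap) for cap in capacity], method="highs")

    delivered = -res.fun                    # compare against demand for reliability
    print(f"delivered flow: {delivered:.1f}")
    for name, flow in zip(edges, res.x):
        print(f"  {name}: {flow:.1f}")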

    Rare event simulation in finite-infinite dimensional space

    Modern engineering systems are becoming increasingly complex. Assessing their risk by simulation is intimately related to the efficient generation of rare failure events. Subset Simulation is an advanced Monte Carlo method for risk assessment and it has been applied in different disciplines. Pivotal to its success is the efficient generation of conditional failure samples, which is generally non-trivial. Conventionally an independent-component Markov Chain Monte Carlo (MCMC) algorithm is used, which is applicable to high dimensional problems (i.e., a large number of random variables) without suffering from ‘curse of dimension’. Experience suggests that the algorithm may perform even better for high dimensional problems. Motivated by this, for any given problem we construct an equivalent problem where each random variable is represented by an arbitrary (hence possibly infinite) number of ‘hidden’ variables. We study analytically the limiting behavior of the algorithm as the number of hidden variables increases indefinitely. This leads to a new algorithm that is more generic and offers greater flexibility and control. It coincides with an algorithm recently suggested by independent researchers, where a joint Gaussian distribution is imposed between the current sample and the candidate. The present work provides theoretical reasoning and insights into the algorithm
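
    The sketch below is a minimal Subset Simulation loop in standard normal space using the correlated-Gaussian candidate scheme the abstract refers to, applied to an assumed toy limit-state function; it only illustrates the conditional sampling step and does not reproduce the paper's analysis.

    # Subset Simulation with conditional (correlated-candidate) MCMC sampling;
    # the limit-state function and all settings are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(1)
    dim, n, p0, rho = 10, 500, 0.1, 0.8    # dimension, samples per level, level probability, correlation

    def g(x):
        """Toy limit-state function: failure when g(x) <= 0."""
        return 6.0 - np.sum(x, axis=-1) / np.sqrt(x.shape[-1])

    # Level 0: direct Monte Carlo.
    x = rng.standard_normal((n, dim))
    y = g(x)
    pf = 1.0

    for level in range(10):
        if np.mean(y <= 0.0) >= p0:        # failure region reached
            pf *= np.mean(y <= 0.0)
            break
        thresh = np.quantile(y, p0)        # intermediate threshold
        seeds = x[y <= thresh]             # seeds of the conditional level
        pf *= p0
        # Grow one chain per seed: candidates are jointly Gaussian with the
        # current sample and accepted only if they stay in the conditional region.
        x_new, y_new = [], []
        for seed in seeds:
            cur = seed
            for _ in range(int(np.ceil(n / len(seeds)))):
                cand = rho * cur + np.sqrt(1.0 - rho**2) * rng.standard_normal(dim)
                if g(cand) <= thresh:
                    cur = cand
                x_new.append(cur.copy())
                y_new.append(g(cur))
        x, y = np.array(x_new)[:n], np.array(y_new)[:n]

    print(f"estimated failure probability: {pf:.2e}")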

    Robust vulnerability analysis of nuclear facilities subject to external hazards

    Natural hazards have the potential to trigger complex chains of events in technological installations, leading to disastrous effects for the surrounding population and environment. The threat that climate change will worsen extreme weather events exacerbates the need for new models and novel methodologies able to capture the complexity of the natural-technological interaction in intuitive frameworks suitable for an interdisciplinary field such as risk analysis. This study proposes a novel approach for the quantification of the risk exposure of nuclear facilities subject to extreme natural events. A Bayesian Network model, initially developed for the quantification of the risk of exposure from spent nuclear material stored in facilities subject to flooding hazards, is adapted and enhanced to include in the analysis the quantification of the uncertainty affecting the output, due to both the imprecision of the available data and the aleatory nature of the variables involved. The model is applied to the analysis of the Sizewell B nuclear power station in East Anglia (UK), through the use of a novel computational tool. The proposed network models the direct effect of extreme weather conditions on the facility over several time scenarios, considering climate change predictions, as well as the indirect effects of external hazards on the internal subsystems and the occurrence of human error. The main novelty of the study consists of the fully computational integration of Bayesian Networks with advanced Structural Reliability Methods, which allows both aleatory and epistemic aspects of the uncertainty affecting the input to be adequately represented through the use of probabilistic models, intervals, imprecise random variables and probability bounds. The uncertainty affecting the output is quantified in order to attest to the significance of the results and to provide a complete and effective tool for risk-informed decision making.
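
    As a hedged illustration of coupling a Bayesian network with imprecise probabilities, the sketch below propagates an interval on one conditional probability through a tiny hypothetical flood-failure-release network by direct enumeration; the structure, node names and numbers are illustrative and unrelated to the Sizewell B model.

    # Two-bound propagation through a minimal hazard network; all probabilities
    # are assumed for illustration only.
    import itertools

    p_flood = 0.02
    p_fail_given_flood = (0.10, 0.25)      # epistemic interval on a fragility value
    p_fail_given_no_flood = 0.001
    p_release_given_fail = 0.30
    p_release_given_ok = 1e-5

    def p_release(p_fail_flood):
        """Total probability of release for one crisp value of the imprecise entry."""
        total = 0.0
        for flood, fail in itertools.product([True, False], repeat=2):
            p_f = p_flood if flood else 1.0 - p_flood
            p_fail = p_fail_flood if flood else p_fail_given_no_flood
            p_fl = p_fail if fail else 1.0 - p_fail
            p_rel = p_release_given_fail if fail else p_release_given_ok
            total += p_f * p_fl * p_rel
        return total

    # Evaluating at the interval endpoints bounds the output probability.
    lo, hi = (p_release(p) for p in p_fail_given_flood)
    print(f"P(release) in [{lo:.2e}, {hi:.2e}]")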

    Risk Assessment of Spent Nuclear Fuel Facilities Considering Climate Change

    Natural hazards have the capability to affect technological installations, triggering multiple failures and putting the population and the surrounding environment at risk. Global climate change introduces an additional, non-negligible element of uncertainty into the vulnerability quantification, threatening to intensify the occurrence of extreme climate events both in frequency and in severity. Sea level extremes and extreme coastal high waters are expected to change in the future as a result of changes in atmospheric storminess and mean sea level rise, as are extreme precipitation events. These trends clearly suggest a parallel increase in the risks affecting technological installations and the subsequent need for mitigation measures to enhance the reliability of existing systems and to improve the design standards of new facilities. Despite this, scientific research in this field lacks robust and reliable tools for this kind of assessment, often relying on the adoption of oversimplified models or strong assumptions, which affect the credibility of the results. The main purpose of this study is to provide a novel and general model for the evaluation of the risk of exposure of spent nuclear fuel stored in a facility subject to flood hazard, investigating the potential and limitations of Bayesian networks (BNs) in this field. The network aims to model the interaction between extreme weather conditions and the technological installation, as well as the propagation of failures within the system itself, taking into account the dependencies among the different components and the occurrence of human error. A real-world application concerning the Sizewell B nuclear power station in East Anglia, United Kingdom, is extensively described, together with the models and data set used. Results are presented for three different time scenarios in which climate change projections have been adopted to estimate future risk.

    A Bayesian model updating procedure for dynamic health monitoring

    Structures under dynamic excitation can undergo crack growth and fracture. For safety reasons, it is of key importance to be able to detect and classify these cracks before unexpected structural failure occurs. In addition, cracks change the dynamic behaviour of structures, impacting their performance. Here, a Bayesian model updating procedure has been implemented for crack detection and for the estimation of crack location and length on a numerical model of a spring suspension arm. A high-fidelity finite element model has been used to simulate experimental data by inserting cracks of different extents at different locations and obtaining reference frequency response functions. A low-fidelity parametric model has then been used in the Bayesian framework to infer the crack location and length by comparing the dynamic responses. It is shown that the proposed methodology can be successfully adopted as a structural health monitoring tool.
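
    A minimal sketch of the Bayesian updating step is shown below, with a toy parametric frequency response standing in for both the high- and low-fidelity models and a random-walk Metropolis sampler inferring crack location and length from a synthesised measurement; all functions and numbers are assumptions for illustration.

    # Bayesian model updating of crack parameters from a dynamic response;
    # the forward model and noise level are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(2)
    freqs = np.linspace(1.0, 50.0, 60)

    def frf(loc, length):
        """Toy parametric frequency response: a crack shifts a resonance peak."""
        f0 = 25.0 - 10.0 * length - 2.0 * loc
        return 1.0 / np.sqrt((1.0 - (freqs / f0) ** 2) ** 2 + 0.05)

    # Synthetic "measurement" from an assumed true crack, plus noise.
    true_loc, true_len, sigma = 0.6, 0.3, 0.02
    measured = frf(true_loc, true_len) + rng.normal(0.0, sigma, freqs.size)

    def log_posterior(theta):
        loc, length = theta
        if not (0.0 <= loc <= 1.0 and 0.0 <= length <= 1.0):   # uniform prior box
            return -np.inf
        resid = measured - frf(loc, length)
        return -0.5 * np.sum((resid / sigma) ** 2)              # Gaussian likelihood

    # Random-walk Metropolis over (location, length).
    theta, samples = np.array([0.5, 0.5]), []
    for _ in range(5000):
        prop = theta + rng.normal(0.0, 0.05, size=2)
        if np.log(rng.uniform()) < log_posterior(prop) - log_posterior(theta):
            theta = prop
        samples.append(theta.copy())

    samples = np.array(samples[1000:])                          # discard burn-in
    print("posterior mean (location, length):", samples.mean(axis=0))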

    UNCERTAINTY QUANTIFICATION OF OPTIMAL THRESHOLD FAILURE PROBABILITY FOR PREDICTIVE MAINTENANCE USING CONFIDENCE STRUCTURES

    This paper seeks to analyse the imprecision associated with the statistical modelling method employed in devising a predictive maintenance framework for a plasma etching chamber. During operation, the plasma etching chamber may fail due to contamination resulting from the high number of particles present. Based on a previous study, the particle count is observed to follow a Negative Binomial distribution, which is also used to model the probability of failure of the chamber. Using this model, an optimum threshold failure probability is determined such that maintenance is scheduled once this value is reached during operation of the chamber and the maintenance cost incurred is lowest. One problem, however, is that the parameter(s) used to define the Negative Binomial distribution may in reality carry uncertainties, which in turn give rise to uncertainty in the optimum threshold failure probability. To address this, the paper adopts confidence structures (or c-boxes) to quantify the uncertainty of the optimum threshold failure probability. This is achieved by introducing variations in the p-parameter of the Negative Binomial distribution and plotting a series of cost-rate versus threshold failure probability curves. Using the information provided by these curves, empirical cumulative distribution functions are constructed for the upper and lower bounds of the threshold failure probability, from which confidence intervals for this quantity are determined at the 50%, 80% and 95% confidence levels.
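
    The propagation of parameter imprecision to the optimal threshold can be sketched as below: a spread of plausible p-values is pushed through an assumed cost-rate model and the resulting optimal thresholds are summarised empirically. This is a simplified stand-in for the paper's confidence-structure (c-box) construction, and the Negative Binomial settings and costs are illustrative.

    # Sensitivity of the optimal maintenance threshold to the Negative Binomial
    # p-parameter; the cost model and parameter ranges are illustrative assumptions.
    import numpy as np
    from scipy.stats import nbinom

    r = 5                                   # assumed Negative Binomial size parameter
    C_prev, C_corr = 1.0, 10.0              # assumed preventive / corrective costs
    thresholds = np.linspace(0.05, 0.95, 50)

    def cost_rate(p, q):
        """Expected cost per cycle when maintenance triggers at failure probability q."""
        cdf = nbinom.cdf(np.arange(500), r, p)
        k = np.searchsorted(cdf, q)         # cycles until P(failure) first reaches q
        return (C_prev + q * C_corr) / max(k, 1)

    def optimal_threshold(p):
        return thresholds[np.argmin([cost_rate(p, q) for q in thresholds])]

    # Imprecision in p represented here by a spread of plausible values.
    p_values = np.linspace(0.15, 0.35, 40)
    q_opt = np.sort([optimal_threshold(p) for p in p_values])

    # Empirical distribution of the optimal threshold and interval summaries.
    for level in (0.50, 0.80, 0.95):
        lo, hi = np.quantile(q_opt, [(1 - level) / 2, (1 + level) / 2])
        print(f"{int(level * 100)}% interval for optimal threshold: [{lo:.2f}, {hi:.2f}]")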